\documentclass[10pt]{article}
\usepackage{amssymb,amsmath}
\usepackage[hmargin=1cm,vmargin=1cm]{geometry}
\begin{document}
{\large Eigenvalues, Eigenvectors and Eigenspace}\\
\begin{align*}
\text{\bf Find}&\text{\bf ing Nemo:}\\
&\text{For a matrix $A$, if there exist a scalar $\lambda$ and a vector $\mathbf{v}\neq 0$ such that $\boxed{A\mathbf{v}=\lambda\mathbf{v}}$~, then we call $\lambda$ an eigenvalue and}\\
&\text{$\mathbf{v}$ an eigenvector of $A$. It follows that}\quad A\mathbf{v}-\lambda\mathbf{v}=(A-\lambda I)\mathbf{v}=0.\quad\therefore\boxed{\det(A-\lambda I)=0}.\quad\text{This means that applying}\\
&\text{$A$ to the vector $\mathbf{v}$ does not change its direction but scales it by a factor of $\lambda$ (the eigenvalue). Not all matrices have}\\
&\text{real eigenvalues (and therefore real eigenvectors). The eigenvectors associated with one eigenvalue $\lambda$, together with $\mathbf{0}$, form}\\
&\text{the eigenspace of $\lambda$.}\\
\\
&\text{e.g. The line or plane a flip matrix flips over is an eigenspace with eigenvalue $1$. If the flip does not shear in other directions,}\\
&\text{the axis it flips along is an eigenspace as well, with eigenvalue $-1$. A rotation matrix (through an angle other than $0$ or $\pi$) has}\\
&\text{no real eigenvalues, as no vector points in the same direction afterwards. (The product of two matrices with real eigenvalues}\\
&\text{does not necessarily have real eigenvalues.)}\\
\\
&\text{For diagonal matrices, the eigenvalues are on the diagonal. If all eigenvalues are distinct, the eigenspaces are the axes (less $\mathbf{0}$).}\\
&\text{e.g. A $3\times 3$ diagonal matrix}~~D=
\begin{bmatrix}
\lambda_1 & 0 & 0 \\
0 & \lambda_2 & 0 \\
0 & 0 & \lambda_3
\end{bmatrix}
\text{has eigenvalues $\lambda_1, \lambda_2$ and $\lambda_3$, so}~~ D\mathbf{i}=\lambda_1\mathbf{i},~~ D\mathbf{j}=\lambda_2\mathbf{j},\text{ and } D\mathbf{k}=\lambda_3\mathbf{k}.\\
&\text{That means the eigenspaces are $a_x\mathbf{i}$, $a_y\mathbf{j}$, and $a_z\mathbf{k}$ respectively, where $a_x, a_y, a_z\in\mathbb{R}-\{0\}$. If there are identical eigenvalues,}\\
&\text{e.g.
$\lambda_1=\lambda_2$, any vector in the $xy$-plane is an eigenvector, as it is scaled equally along both axes. Generally, if}\\
&\text{there are $r$ identical eigenvalues, their corresponding eigenspace will be of dimension $r$. If all eigenvalues are identical,}\\
&\text{the matrix becomes $\lambda I$ and its eigenspace will be $\mathbb{R}^n$.}\\
\\
&\text{Similarly, for a triangular matrix $A$, $\det(A-\lambda I)=\prod_{k=1}^n(a_{kk}-\lambda)=0$, so all eigenvalues are on the diagonal as well. The}\\
&\text{eigenspaces are no longer the axes, and $r$ identical eigenvalues need not give an eigenspace of dimension $r$: e.g.}\\
&\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}
\text{has eigenvalue $1$ twice, but its eigenspace is only the $x$-axis.}\\
&\text{(The above analysis implies that an $n\times n$ matrix has at most $n$ distinct eigenvalues.)}\\
\\
&\text{Note that row operations on $A$ itself generally change its eigenvalues, so one cannot simply row-reduce $A$ and read them off.}\\
&\text{Instead, one reduces $A-\lambda I$ (with $\lambda$ kept symbolic) to triangular form (row operations change a determinant only by known}\\
&\text{factors) to find the roots of $\det(A-\lambda I)=0$, then substitutes each eigenvalue into $A-\lambda I$ to solve for the eigenvectors.}\\
\\
\end{align*}
%
%
%
\begin{align*}
\text{\bf Char}&\text{\bf acteristic polynomial:}\\
&\text{If matrix $A$ has distinct eigenvalues $\lambda_r$ ($r=1,2,\ldots,n$), it will have $n$ independent eigenvectors $\mathbf{v_r}$.}\quad\because~A\mathbf{v_r}=\lambda_r\mathbf{v_r},\\
&A\big[\mathbf{v_1}\mathbf{v_2}\ldots\mathbf{v_n}\big]=\big[A\mathbf{v_1}~A\mathbf{v_2}\ldots A\mathbf{v_n}\big]=\big[(\lambda_1\mathbf{v_1})(\lambda_2\mathbf{v_2})\ldots(\lambda_n\mathbf{v_n})\big]=\big[\mathbf{v_1}\mathbf{v_2}\ldots\mathbf{v_n}\big]D,\quad \text{where }D=
\begin{bmatrix}
\lambda_1 & 0 & \ldots & 0 \\
0 & \lambda_2 & \ldots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \ldots & \lambda_n
\end{bmatrix} \\
&\therefore~AP=PD,\quad\text{where }P=\big[\mathbf{v_1}\mathbf{v_2}\ldots\mathbf{v_n}\big].\quad \text{As the $\mathbf{v_r}$ are linearly independent, $P$ is invertible, so $A=PDP^{-1}$.}\\
\\
&\text{Since
}\det(AP)=\det(PD),\quad\det(A)\det(P)=\det(P)\det(D),\quad \boxed{\det(A)=\det(D)=\prod_{k=1}^n\lambda_k.}\quad\text{If $A$ is invertible, so is $D$.}\\
\\
&\text{We will also have }\boxed{A^n=PD^nP^{-1}} \text{ as }A^n =PDP^{-1}\cdot PDP^{-1}\ldots PDP^{-1} =PD(P^{-1}P)D(P^{-1}P)\ldots DP^{-1} =PD^nP^{-1}.\\
\\
&\text{For any polynomial }g(A)=\sum_{k=0}^m c_k A^k=\sum_{k=0}^m c_k(PD^kP^{-1})=P\left(\sum_{k=0}^m c_k D^k\right)P^{-1}=P~g(D)~P^{-1},~\boxed{g(A)=P~g(D)~P^{-1}.}\\
\\
&D^k=
\begin{bmatrix}
\lambda_1^k & 0 & \ldots & 0 \\
0 & \lambda_2^k & \ldots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \ldots & \lambda_n^k
\end{bmatrix},\quad
g(D)=\sum_{k=0}^m c_k D^k=
\begin{bmatrix}
\sum c_k \lambda_1^k & 0 & \ldots & 0 \\
0 & \sum c_k \lambda_2^k & \ldots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \ldots & \sum c_k \lambda_n^k
\end{bmatrix} =
\begin{bmatrix}
g(\lambda_1) & 0 & \ldots & 0 \\
0 & g(\lambda_2) & \ldots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \ldots & g(\lambda_n)
\end{bmatrix}\\
\\
&\text{Let }f(\lambda)= \det(A-\lambda I)=
\begin{vmatrix}
a_{11} - \lambda & a_{12} & \ldots & a_{1n} \\
a_{21} & a_{22} - \lambda & \ldots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{n1} & a_{n2} & \ldots & a_{nn} - \lambda
\end{vmatrix}
=(a_{11} - \lambda)(a_{22} - \lambda)\ldots(a_{nn} - \lambda)+\ldots\text{(terms of lower degree in $\lambda$)}\ldots\\
\\
&\text{Since each eigenvalue satisfies $f(\lambda_k)=0$, the diagonal matrix $f(D)$ is zero, so }\boxed{f(A)=P~f(D)~P^{-1}=0}\quad\text{where $f(\lambda)= \det(A-\lambda I)$.}\\
&\text{(Cayley-Hamilton Theorem; shown here for diagonalisable $A$, but true for all square matrices.)}\\
\\
&\text{The function $f(x)=\det(A-xI)$ is called the characteristic polynomial of matrix $A$. The eigenvalues of $A$ are the}\\
&\text{roots of its characteristic polynomial $f(x)$. In other words, given a transformation matrix, one may find its}\\
&\text{eigenvalues by solving the characteristic polynomial.
Just as determinants provide a ``connection'' between matrices and}\\
&\text{scalars, eigenvalues connect matrices and polynomials.}\\
\\
&\text{Note that this idea of ``connections'' is not a formal mathematical concept, but it helps you think. For example,}\\
&\text{$\det(AB)=\det(A)\cdot\det(B)$ and $\det(A^{-1})=\frac{1}{\det(A)}$ together imply that $A$ has no inverse if $\det(A)=0$.}\\
\end{align*}
\end{document}
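A short worked example, in the same style as the notes, ties the pieces together: the characteristic polynomial, the eigenvectors, diagonalisation, the power formula $A^n=PD^nP^{-1}$, and the Cayley-Hamilton check. The matrix $A$ below is an illustrative choice, not one taken from the original notes.

```latex
\begin{align*}
\text{\bf Work}&\text{\bf ed example:}\\
&\text{Let }A=\begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}.\quad
f(\lambda)=\det(A-\lambda I)=
\begin{vmatrix} 2-\lambda & 1 \\ 1 & 2-\lambda \end{vmatrix}
=(2-\lambda)^2-1=(\lambda-3)(\lambda-1)=0,\quad\therefore~\lambda_1=3,~\lambda_2=1.\\
&\lambda_1=3:~(A-3I)\mathbf{v}=
\begin{bmatrix} -1 & 1 \\ 1 & -1 \end{bmatrix}\mathbf{v}=0
~\Rightarrow~\mathbf{v_1}=\begin{bmatrix} 1 \\ 1 \end{bmatrix};\qquad
\lambda_2=1:~(A-I)\mathbf{v}=
\begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}\mathbf{v}=0
~\Rightarrow~\mathbf{v_2}=\begin{bmatrix} 1 \\ -1 \end{bmatrix}.\\
&P=\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix},\quad
P^{-1}=\frac{1}{2}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix},\quad
D=\begin{bmatrix} 3 & 0 \\ 0 & 1 \end{bmatrix},\quad
A^n=PD^nP^{-1}=\frac{1}{2}
\begin{bmatrix} 3^n+1 & 3^n-1 \\ 3^n-1 & 3^n+1 \end{bmatrix}.\\
&\text{Checks: }\det(A)=3=\lambda_1\lambda_2,\quad
f(A)=A^2-4A+3I=
\begin{bmatrix} 5 & 4 \\ 4 & 5 \end{bmatrix}-
\begin{bmatrix} 8 & 4 \\ 4 & 8 \end{bmatrix}+
\begin{bmatrix} 3 & 0 \\ 0 & 3 \end{bmatrix}=0.
\end{align*}
```

Setting $n=1$ in the power formula recovers $A$ itself, a quick way to confirm the diagonalisation.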